
    Beta-Testing the “Particular Machine”: The Machine-or-Transformation Test in Peril and Its Impact on Cloud Computing

    This Issue Brief examines recent cases addressing the patent eligibility of computer-implemented method claims and their implications for the development of cloud computing technologies. Despite the Supreme Court’s refusal to endorse the machine-or-transformation test as the exclusive patent eligibility inquiry, lower courts have continued to invalidate method claims using a stringent “particular machine” requirement alongside the requisite abstract-ideas analysis. This Issue Brief argues that 1) post-Bilski v. Kappos cases have failed to elucidate what constitutes a particular machine for computer-implemented methods; 2) in light of substantial variance among Federal Circuit judges’ Section 101 jurisprudence, the application of the particular machine requirement has become subject to a high degree of panel dependency, such that its relevance for analyzing software method claims has come under question; and 3) notwithstanding the unease expressed by practitioners and scholars about the future of cloud computing patents, the courts’ hardening stance toward computer-implemented method claims will do little to deter patenting in the cloud computing context. Instead, clouds delivering platform and software services will remain capable of satisfying the particular machine requirement and supporting patent eligibility, especially given the possible dilution of the particular machine requirement itself.

    The Best for Last: The Timing of U.S. Supreme Court Decisions

    This Article investigates the hypothesis that the most important and, often, controversial and divisive cases—so-called big cases—are disproportionately decided at the end of June. We define a big case in one of four ways: front-page coverage in the New York Times; front-page and other coverage in four national newspapers (the New York Times, Los Angeles Times, Washington Post, and Chicago Tribune); the number of amicus curiae briefs filed in a case; and the number of subsequent citations by the Supreme Court to its decision in a case. We find a statistically significant association between each measure of a big case and end-of-term decisions, even after controlling for the month of oral argument (cases argued later in the term are more likely to be decided near the end of the term) and case attributes (e.g., dissents and concurrences) that increase the time it takes to decide a case. We also speculate on why big cases cluster at the end of the term. One possibility is legacy and reputational concerns: when writing what they think will be a major decision, the Justices and their law clerks take more time polishing it until the last minute in the hope of promoting their reputations. Another is that the end-of-term clustering of the most important cases may tend to diffuse media coverage of and other commentary regarding any particular case, and thus spare the Justices unwanted criticism just before they leave Washington for their summer recess.
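
    The regression design described above—an end-of-term outcome regressed on several "big case" measures while controlling for argument month and opinion-slowing case attributes—can be sketched in a few lines. The sketch below is only illustrative: the variable names and the synthetic data are assumptions, not the authors' dataset or specification.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Synthetic stand-in data; column names are illustrative only.
        rng = np.random.default_rng(0)
        n = 500
        df = pd.DataFrame({
            "front_page": rng.integers(0, 2, n),      # NYT front-page coverage (one "big case" proxy)
            "amicus_briefs": rng.poisson(5, n),       # number of amicus curiae briefs filed
            "argument_month": rng.integers(1, 8, n),  # 1 = October sitting ... 7 = April sitting
            "dissents": rng.integers(0, 5, n),        # separate opinions that slow a case down
        })
        # Outcome: was the case decided at the end of the term?
        latent = 0.8 * df.front_page + 0.05 * df.amicus_briefs + 0.3 * df.argument_month - 2.5
        df["decided_end_of_term"] = (rng.random(n) < 1 / (1 + np.exp(-latent))).astype(int)

        # Logistic regression of end-of-term timing on the big-case measure,
        # controlling for argument month and case attributes.
        model = smf.logit(
            "decided_end_of_term ~ front_page + amicus_briefs + argument_month + dissents",
            data=df,
        ).fit()
        print(model.summary())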

    Object-oriented construction of a multigrid electronic-structure code with Fortran 90

    We describe the object-oriented implementation of a higher-order finite-difference density-functional code in Fortran 90. Object-oriented models of the grid and related objects are constructed and employed in the implementation of an efficient one-way multigrid method we have recently proposed for density-functional electronic-structure calculations. A detailed analysis of the performance and strategy of the one-way multigrid scheme is presented.
    Comment: 24 pages, 6 figures, to appear in Comput. Phys. Commun.
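
    For readers unfamiliar with the coarse-to-fine idea behind a one-way multigrid cycle, the sketch below illustrates it for a 1D Poisson problem with Gauss–Seidel relaxation. It is a minimal Python illustration of the general technique under assumed parameters, not the authors' Fortran 90 implementation or their electronic-structure solver.

        import numpy as np

        def gauss_seidel(u, f, h, sweeps):
            """In-place Gauss-Seidel relaxation for -u'' = f with zero Dirichlet boundaries."""
            for _ in range(sweeps):
                for i in range(1, len(u) - 1):
                    u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
            return u

        def prolongate(u_coarse):
            """Linear interpolation from a coarse grid to the next finer grid."""
            u_fine = np.zeros(2 * (len(u_coarse) - 1) + 1)
            u_fine[::2] = u_coarse
            u_fine[1::2] = 0.5 * (u_coarse[:-1] + u_coarse[1:])
            return u_fine

        def one_way_multigrid(f_func, levels=5, sweeps=50):
            """One-way (coarse-to-fine) multigrid: solve on the coarsest grid, then
            repeatedly interpolate the result as the initial guess on the next finer
            grid and relax there, without returning to coarser levels."""
            u = np.zeros(3)  # coarsest grid: 3 points on [0, 1]
            for level in range(levels):
                h = 1.0 / (len(u) - 1)
                x = np.linspace(0.0, 1.0, len(u))
                gauss_seidel(u, f_func(x), h, sweeps)
                if level < levels - 1:
                    u = prolongate(u)
            return u

        # Example: -u'' = pi^2 sin(pi x) has exact solution u = sin(pi x).
        u = one_way_multigrid(lambda x: np.pi**2 * np.sin(np.pi * x))
        print(np.max(np.abs(u - np.sin(np.pi * np.linspace(0, 1, len(u))))))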

    A Wave Pattern Analysis For EEG Signal Based Upon A Successive Digital Smoothing Process

    Xenobiotic Detoxication Potential and Drug-Induced Changes in Carbaryl Toxicity in the Alfalfa Leafcutter Bee, Megachile rotundata (Fabricius)

    The alfalfa leafcutter bee, Megachile rotundata, one of the most important pollinators of alfalfa, generally became more susceptible to carbaryl as the adults aged. LD50 values for carbaryl toxicity were 240, 166, 109, and 51 μg carbaryl/gram for 1-, 2-, 3-, and 4-day-old adult male bees; for female bees they were 245, 551, 289, and 262 μg carbaryl/gram. Lipid content and microsomal enzyme activity (measured by EPN detoxication in vitro) decreased in both sexes as the bees aged. Synergist ratios, defined as the ratio of the LD50 of carbaryl alone to the LD50 of carbaryl applied after the synergist piperonyl butoxide, were measured. These did not correlate with either decreasing LD50 values or decreasing EPN detoxication in the males, being 12, 21, 42, and 53 for the 1-, 2-, 3-, and 4-day-old bees. In female bees the synergist ratio corresponded more closely, being 23, 49, 11, and 15 as the females aged from 1 to 4 days. Published literature has suggested that such synergist ratios are usable as in vitro estimates of detoxication ability, higher values indicating more detoxication potential, but there may be difficulties in the proper estimation of the synergist ratio. The use of enzyme-inducing drugs for protection of the bees from carbaryl poisoning was studied. Among the three drugs used, only chlorcyclizine increased the LD50 value (about two-fold) in the 4-day-old male. Aminopyrine increased the male's susceptibility to carbaryl, while phenobarbital had no effect at all. None of the drugs significantly changed the female's susceptibility to carbaryl except aminopyrine, which lowered the LD50 from about 262 to 75 μg/gram. Studies with in vitro EPN detoxication indicated that all three drugs caused increases in microsomal enzyme activity of up to four- or five-fold and decreases in lipid content of about 20 to 30%.
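
    As a small arithmetic illustration of the synergist-ratio definition given above, the snippet below computes the ratio from a pair of LD50 values; the numbers are hypothetical and chosen only to show the calculation, not data from the study.

        def synergist_ratio(ld50_alone, ld50_with_synergist):
            """Ratio of the LD50 of the insecticide alone to the LD50 of the
            insecticide applied after a synergist (here, piperonyl butoxide)."""
            return ld50_alone / ld50_with_synergist

        # Hypothetical example: carbaryl alone at 240 ug/g and, after piperonyl
        # butoxide pretreatment, at 20 ug/g gives a synergist ratio of 12.
        print(synergist_ratio(240.0, 20.0))  # -> 12.0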

    The Symmetry-Decision Method of EEG Analysis

    An improved version of a previously presented computer analysis for EEG is described. The basic process used in the analysis is the classification of waves according to duration (frequency) and amplitude. The unique aspect of the improved system is the distinction between simple waves and composite waves, which are slow waves with superimposed higher frequencies. The computer program makes this decision on the basis of wave symmetry.
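
    The abstract does not spell out the symmetry criterion, so the sketch below is only one plausible reading of the idea: each wave between two zero crossings is classified by duration and amplitude, and labeled composite when its rising and falling half-durations are markedly unequal. The function names and the threshold are assumptions, not the published algorithm.

        import numpy as np

        def classify_waves(signal, fs, asymmetry_threshold=0.3):
            """Segment a trace at zero crossings and, for each wave, report its
            duration, amplitude, and a simple/composite label based on how
            symmetric the wave is about its peak (illustrative criterion only)."""
            sign = np.signbit(signal)
            crossings = np.flatnonzero(sign[:-1] != sign[1:])
            waves = []
            for start, end in zip(crossings[:-1], crossings[1:]):
                segment = signal[start:end + 1]
                peak = int(np.argmax(np.abs(segment)))
                rise, fall = peak, len(segment) - 1 - peak   # samples before/after the peak
                asymmetry = abs(rise - fall) / max(rise + fall, 1)
                waves.append({
                    "duration_s": (end - start) / fs,
                    "amplitude": float(np.abs(segment[peak])),
                    "label": "composite" if asymmetry > asymmetry_threshold else "simple",
                })
            return waves

        # Example: a 2 Hz wave with a superimposed 12 Hz component, sampled at 256 Hz.
        t = np.arange(0, 1, 1 / 256)
        eeg = np.sin(2 * np.pi * 2 * t) + 0.3 * np.sin(2 * np.pi * 12 * t)
        print(classify_waves(eeg, fs=256)[:3])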

    Tunneling decay of false vortices

    We consider the decay of vortices trapped in the false vacuum of a theory of scalar electrodynamics in 2+1 dimensions. The potential is inspired by models with intermediate symmetry breaking to a metastable vacuum that completely breaks a U(1) symmetry, while in the true vacuum the symmetry is unbroken. The false vacuum is unstable through the formation of true vacuum bubbles; however, the rate of decay can be extremely long. On the other hand, the false vacuum can contain metastable vortex solutions. These vortices contain the true vacuum inside, in addition to a unit of magnetic flux and the appropriate topologically nontrivial false vacuum outside. We numerically establish the existence of vortex solutions which are classically stable; however, they can decay via tunneling. In general terms, they tunnel to a large, thin-walled vortex configuration that is then classically unstable to the expansion of its radius. We compute an estimate for the tunneling amplitude in the semi-classical approximation. We believe our analysis would be relevant to superconducting thin films or superfluids.
    Comment: 27 pages, 9 figures
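
    For orientation, the generic semi-classical form of such a tunneling rate is sketched below. This is the standard bounce formula, not the paper's specific estimate; here B denotes the Euclidean action of the bounce measured relative to that of the initial metastable vortex, and A is a fluctuation prefactor.

        \Gamma \simeq A \, e^{-B/\hbar}, \qquad
        B = S_E[\text{bounce}] - S_E[\text{metastable vortex}]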

    ACHIEVING EFFICIENCY AND EQUITY IN IRRIGATION MANAGEMENT: AN OPTIMIZATION MODEL OF THE EL ANGEL WATERSHED, CARCHI, ECUADOR

    The objective of this paper is to address the problems of inefficiency and inequity in water allocation in the El Angel watershed, located in Ecuador's Sierra region. Water is captured in a high-altitude region of the watershed and distributed downstream to producers in four elevation-defined zones via a system of canals. Upstream and downstream producers face radically different conditions with respect to climate and terrain. A mathematical programming model was created to study the consequences of addressing chronic water scarcity problems in the watershed by shifting water resources between the four zones. The model captures the nature of water use by humans, crops, and dual-purpose cattle. Its objective function maximizes producer welfare as measured by aggregate gross margin, subject to limited supplies of land, labor, and water. Five water allocation scenarios are evaluated with respect to efficiency in land and water use and equity in income distribution. Results reveal that although water is the primary constrained resource downstream, in the upstream zones land is far scarcer. The current distribution of water rights does not consider these differences and therefore is neither efficient nor equitable. Improvements in efficiency (resource use) and equity (income distribution) are associated with (1) a shift of water to the lower zone, and (2) the use of lower levels of irrigation intensity upstream. Furthermore, the scenarios that result in the most efficient use of resources also bring the greatest degree of equity in income distribution, indicating that these may be complementary, not conflicting, goals.
    Keywords: Mathematical programming, water allocation, efficiency, equity, Resource/Energy Economics and Policy
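
    A minimal sketch of the kind of allocation model the abstract describes is shown below as a small linear program in Python. The zones, gross margins, input requirements, and resource limits are invented placeholders, not the El Angel data, and the study's actual model is far richer (multiple crops, cattle, and five allocation scenarios).

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical gross margin (USD/ha) for one cropping activity in each of
        # four elevation zones; the real model has many activities per zone.
        gross_margin = np.array([400.0, 550.0, 700.0, 900.0])

        # Per-hectare requirements of labor (person-days) and irrigation water
        # (thousand m^3), again purely illustrative numbers.
        labor_per_ha = np.array([60.0, 70.0, 80.0, 90.0])
        water_per_ha = np.array([2.0, 3.0, 5.0, 8.0])

        land_limit = np.array([300.0, 400.0, 500.0, 600.0])  # ha available per zone
        labor_limit = 90_000.0                                # person-days, whole watershed
        water_limit = 6_000.0                                 # thousand m^3, whole watershed

        # linprog minimizes, so maximize aggregate gross margin by negating it.
        res = linprog(
            c=-gross_margin,
            A_ub=np.vstack([labor_per_ha, water_per_ha]),
            b_ub=[labor_limit, water_limit],
            bounds=[(0.0, land) for land in land_limit],  # zone land limits as variable bounds
            method="highs",
        )
        print("Irrigated area by zone (ha):", np.round(res.x, 1))
        print("Aggregate gross margin (USD):", round(-res.fun, 2))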